Probability Theory
Notation
- $X$: Random Variable
- $x$: Realization of $X$
- $\mathcal{X}$: Alphabet or Domain of $X$
Probability Mass Functions
- $p_X(x)$: Probability Mass Function (pmf) of $X$
$$
\begin{aligned}
p_{X}(x) &= P[X = x]
\end{aligned}
$$
- $p_{X, Y}(x, y)$: Joint pmf of $X$, $Y$
$$
\begin{aligned}
p_{X, Y}(x, y) &= P[X = x, Y = y] \\
p_{X}(x) &= \sum_{y \in \mathcal{Y}} p_{X, Y}(x, y) \quad \text{(marginal pmf of $X$)} \\
p_{Y}(y) &= \sum_{x \in \mathcal{X}} p_{X, Y}(x, y) \quad \text{(marginal pmf of $Y$)}
\end{aligned}
$$
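As a concrete illustration (toy values assumed, not from the notes above), the sketch below stores a joint pmf as a dict, checks that it is a valid pmf, and recovers both marginals by summing out the other variable:

```python
# Minimal sketch: an assumed 2x2 joint pmf p_{X,Y}, validated and marginalized.
from fractions import Fraction

p_XY = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(3, 10),
    (1, 0): Fraction(2, 10), (1, 1): Fraction(4, 10),
}

# A valid pmf is non-negative and sums to 1 over the joint alphabet.
assert all(p >= 0 for p in p_XY.values())
assert sum(p_XY.values()) == 1

# Marginals: sum the joint pmf over the other variable.
p_X, p_Y = {}, {}
for (x, y), p in p_XY.items():
    p_X[x] = p_X.get(x, 0) + p  # p_X(x) = sum over y of p_{X,Y}(x, y)
    p_Y[y] = p_Y.get(y, 0) + p  # p_Y(y) = sum over x of p_{X,Y}(x, y)

print(p_X)  # {0: Fraction(2, 5), 1: Fraction(3, 5)}
print(p_Y)  # {0: Fraction(3, 10), 1: Fraction(7, 10)}
```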
Independence
- $X$, $Y$ are independent if and only if, for all $x \in \mathcal{X}$ and $y \in \mathcal{Y}$ (see the sketch after this list):
$$
\begin{aligned}
p_{X, Y}(x, y) = p_{X}(x) \cdot p_{Y}(y)
\end{aligned}
$$
- If $(X_1, \ldots, X_n)$ are independent and identically distributed (i.i.d.) with common pmf $p_X$, then:
$$
\begin{aligned}
p_{X_1, \ldots, X_n}(x_1, \ldots, x_n) = \prod_{i=1}^{n} p_{X}(x_i)
\end{aligned}
$$
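A sketch of both conditions (all table values are assumptions for illustration): first a pointwise independence test against the product of marginals, then the i.i.d. product rule applied to a fair-coin sequence:

```python
# Sketch: pointwise independence test, then the i.i.d. product rule.
from fractions import Fraction
from math import prod

# (1) Independence: compare p_{X,Y}(x, y) with p_X(x) * p_Y(y) everywhere.
p_XY = {
    (0, 0): Fraction(1, 10), (0, 1): Fraction(3, 10),
    (1, 0): Fraction(2, 10), (1, 1): Fraction(4, 10),
}
p_X = {0: Fraction(2, 5), 1: Fraction(3, 5)}    # marginals of p_XY
p_Y = {0: Fraction(3, 10), 1: Fraction(7, 10)}

independent = all(p == p_X[x] * p_Y[y] for (x, y), p in p_XY.items())
print(independent)  # False: p_{X,Y}(0, 0) = 1/10 but p_X(0) * p_Y(0) = 3/25

# (2) i.i.d. product rule: the joint pmf of an observed sequence is the
# product of per-symbol pmf values (fair coin assumed).
p = {0: Fraction(1, 2), 1: Fraction(1, 2)}      # common pmf p_X of each X_i
sequence = [1, 0, 0, 1, 1]                      # realization (x_1, ..., x_n)
print(prod(p[x] for x in sequence))             # 1/32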
Expectation
$$
\begin{aligned}
E[X] = \sum_{x \in \mathcal{X}} x \cdot p_{X}(x)
\end{aligned}
$$
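A quick worked instance (fair six-sided die, an assumption for illustration): the expectation is the pmf-weighted sum over the alphabet.

```python
# Sketch: E[X] for a fair six-sided die, computed exactly from its pmf.
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}  # p_X(x) = 1/6 on {1, ..., 6}
E_X = sum(x * p for x, p in p_X.items())        # sum over the alphabet
print(E_X)  # 7/2
```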
- Law of Large Numbers (LLN): If $(X_1, \ldots, X_n)$ are i.i.d. with finite mean $E[X]$, then the sample mean converges to $E[X]$ (almost surely, by the strong LLN):
$$
\begin{aligned}
\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow[n \rightarrow \infty]{\text{a.s.}} E[X]
\end{aligned}
$$
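A small simulation (parameters assumed) makes the LLN tangible: sample means of i.i.d. die rolls drift toward $E[X] = 7/2$ as $n$ grows.

```python
# Sketch: empirical LLN check with i.i.d. fair-die rolls (assumed setup).
import random

random.seed(0)  # fixed seed so the run is reproducible
for n in (10, 1_000, 100_000):
    sample_mean = sum(random.randint(1, 6) for _ in range(n)) / n
    print(n, sample_mean)  # approaches E[X] = 3.5 as n grows
```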
Variance
$$
\begin{aligned}
\mathrm{Var}(X) = E \left[ (X - E[X])^2 \right]
\end{aligned}
$$
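Continuing the assumed fair-die example, a direct evaluation of the definition gives $\mathrm{Var}(X) = 35/12$:

```python
# Sketch: Var(X) for a fair die, computed exactly from the definition.
from fractions import Fraction

p_X = {x: Fraction(1, 6) for x in range(1, 7)}
E_X = sum(x * p for x, p in p_X.items())                 # E[X] = 7/2
Var_X = sum((x - E_X) ** 2 * p for x, p in p_X.items())  # E[(X - E[X])^2]
print(Var_X)  # 35/12
```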
- Central Limit Theorem (CLT): If $(X_1, \ldots, X_n)$ are i.i.d. with mean $\mu = E[X]$ and finite variance $\mathrm{Var}(X)$, then the standardized sum converges in distribution to a Gaussian:
$$
\begin{aligned}
\frac{1}{\sqrt{n}} \sum_{i=1}^{n} (X_i - \mu) \xrightarrow[n \rightarrow \infty]{d} \mathcal{N}(0, \mathrm{Var}(X))
\end{aligned}
$$
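A simulation sketch of the CLT (all parameters assumed): standardized sums of i.i.d. die rolls should show sample mean near $0$ and sample variance near $\mathrm{Var}(X) = 35/12 \approx 2.917$.

```python
# Sketch: empirical CLT check via standardized sums of i.i.d. die rolls.
import random
import statistics

random.seed(0)
n, trials = 1_000, 5_000
mu = 3.5  # E[X] for a fair die

zs = [
    sum(random.randint(1, 6) - mu for _ in range(n)) / n ** 0.5
    for _ in range(trials)
]
print(statistics.mean(zs))      # close to 0
print(statistics.variance(zs))  # close to Var(X) = 35/12 ~ 2.917
```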